Accelerated Uzawa methods for convex optimization
Authors
Abstract
Similar resources
Accelerated Methods for Non-Convex Optimization
We present an accelerated gradient method for non-convex optimization problems with Lipschitz continuous first and second derivatives. The method requires time O(ε^(−7/4) log(1/ε)) to find an ε-stationary point, meaning a point x such that ‖∇f(x)‖ ≤ ε. The method improves upon the O(ε^(−2)) complexity of gradient descent and provides the additional second-order guarantee that ∇²f(x) ⪰ −O(√ε)I for the computed x.
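As a point of reference for the ε-stationarity criterion above, here is a minimal sketch of a plain Nesterov-accelerated gradient loop that stops once ‖∇f(x)‖ ≤ ε. It is not the paper's method (which adds second-order machinery to obtain the improved rate); the function names and the toy quadratic are illustrative assumptions.

```python
# Illustrative sketch (not the paper's algorithm): a plain Nesterov-accelerated
# gradient loop that stops at an eps-stationary point, i.e. ||grad f(x)|| <= eps.
import numpy as np

def accelerated_gradient(grad, x0, lipschitz, eps=1e-4, max_iter=10_000):
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    step = 1.0 / lipschitz                 # standard 1/L step size
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) <= eps:       # eps-stationarity test from the abstract
            return y
        x_next = y - step * g
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return y

# Toy usage on a smooth quadratic f(x) = 0.5 * x^T A x (illustrative only).
A = np.diag([1.0, 10.0])
x_hat = accelerated_gradient(lambda x: A @ x, x0=[5.0, -3.0], lipschitz=10.0)
```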
Accelerated gradient sliding for structured convex optimization
Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...
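To make the "skipped gradient" idea concrete, the following is a minimal sketch of the sliding pattern, under the assumption that grad_h is the gradient whose evaluations are being saved: it is computed once per outer iteration and reused across the inner steps. The actual AGS method wraps this pattern in an accelerated scheme with carefully chosen step sizes; the function names and toy quadratics are illustrative assumptions.

```python
# Schematic of the "sliding" pattern (not the actual AGS scheme): grad_h is
# evaluated once per outer iteration and reused, while grad_f is recomputed
# at every inner step.
import numpy as np

def sliding_sketch(grad_f, grad_h, x0, step=1e-2, outer=100, inner=10):
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        gh = grad_h(x)                       # reused ("skipped") gradient
        for _ in range(inner):
            x = x - step * (grad_f(x) + gh)  # grad_f recomputed every inner step
    return x

# Toy usage: two smooth quadratic components with different Lipschitz constants.
A = np.diag([50.0, 40.0])   # larger Lipschitz constant: gradient evaluated often
B = np.diag([1.0, 2.0])     # smaller Lipschitz constant: gradient reused
x_hat = sliding_sketch(lambda x: A @ x, lambda x: B @ x, x0=[1.0, -1.0])
```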
Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
In [19], a general, inexact, efficient proximal quasi-Newton algorithm for composite optimization problems has been proposed and a sublinear global convergence rate has been established. In this paper, we analyze the convergence properties of this method, both in the exact and inexact setting, in the case when the objective function is strongly convex. We also investigate a practical variant of...
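For context, here is a minimal proximal-gradient baseline for an ℓ1-regularized problem of the kind the composite setting above covers; the proximal quasi-Newton method being analyzed replaces the scalar 1/L scaling with a quasi-Newton metric and allows the scaled proximal subproblem to be solved inexactly. The ℓ1 regularizer and the least-squares objective are illustrative assumptions.

```python
# Minimal proximal-gradient baseline for min_x f(x) + lam * ||x||_1.
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(grad_f, lipschitz, lam, x0, iters=500):
    x = np.asarray(x0, dtype=float)
    step = 1.0 / lipschitz
    for _ in range(iters):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Toy usage: f(x) = 0.5 * ||Ax - b||^2, an l1-regularized least-squares problem.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
L = np.linalg.norm(A.T @ A, 2)             # Lipschitz constant of grad f
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b), L, lam=0.1, x0=[0.0, 0.0])
```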
An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) framework. Iteration-complexity results are established for the A-HPE framework, as well as a special version of it, where a large stepsize condition is imposed. Two specific implementations of the A-HPE framework are described in the co...
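As background for the HPE family, the following is a minimal exact proximal-point sketch; the A-HPE framework described above allows this subproblem to be solved approximately under a relative-error criterion and adds acceleration. The closed-form quadratic subproblem below is an illustrative assumption.

```python
# Exact proximal-point sketch: x_{k+1} = argmin_x f(x) + (1/(2*lam))*||x - x_k||^2,
# shown for a quadratic f where the subproblem has a closed form.
import numpy as np

def proximal_point_quadratic(Q, c, x0, lam=1.0, iters=50):
    """Minimize f(x) = 0.5 x^T Q x + c^T x by exact proximal-point steps."""
    n = len(c)
    x = np.asarray(x0, dtype=float)
    M = Q + np.eye(n) / lam                # Hessian of the prox subproblem
    for _ in range(iters):
        # Optimality of the subproblem: Q x+ + c + (x+ - x)/lam = 0
        x = np.linalg.solve(M, x / lam - c)
    return x

# Toy usage on a strongly convex quadratic; the iterates approach -Q^{-1} c.
Q = np.array([[2.0, 0.3], [0.3, 1.0]])
c = np.array([-1.0, 0.5])
x_star = proximal_point_quadratic(Q, c, x0=[0.0, 0.0])
```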
Accelerated Primal-Dual Proximal Block Coordinate Updating Methods for Constrained Convex Optimization
Block Coordinate Update (BCU) methods enjoy low per-update computational complexity because each time only one or a few block variables need to be updated among a possibly large number of blocks. They are also easily parallelized and thus have been particularly popular for solving problems involving large-scale datasets and/or variables. In this paper, we propose a primal-...
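As a plain (non-accelerated) reference point for the primal-dual block updates described above, here is a minimal sketch that cycles through two blocks of an augmented Lagrangian and then takes a multiplier ascent step. The quadratic objective, the two-block split, and the function names are illustrative assumptions; the paper's accelerated schemes differ in the details.

```python
# Block-coordinate augmented-Lagrangian sketch for
# min 0.5*||x1||^2 + 0.5*||x2||^2  s.t.  A1 x1 + A2 x2 = b:
# update one block at a time, then the multiplier.
import numpy as np

def block_update(A_i, lam, residual_wo_i, rho):
    # argmin_{x_i} 0.5||x_i||^2 + lam^T A_i x_i + (rho/2)||A_i x_i + residual_wo_i||^2
    n = A_i.shape[1]
    lhs = np.eye(n) + rho * A_i.T @ A_i
    rhs = -A_i.T @ lam - rho * A_i.T @ residual_wo_i
    return np.linalg.solve(lhs, rhs)

def primal_dual_bcu(A1, A2, b, rho=1.0, iters=200):
    x1 = np.zeros(A1.shape[1]); x2 = np.zeros(A2.shape[1]); lam = np.zeros(len(b))
    for _ in range(iters):
        x1 = block_update(A1, lam, A2 @ x2 - b, rho)   # update block 1
        x2 = block_update(A2, lam, A1 @ x1 - b, rho)   # update block 2
        lam = lam + rho * (A1 @ x1 + A2 @ x2 - b)      # dual (multiplier) ascent
    return x1, x2, lam

# Toy usage with a single linear constraint coupling the two blocks.
A1 = np.array([[1.0, 2.0]]); A2 = np.array([[0.5]]); b = np.array([1.0])
x1, x2, lam = primal_dual_bcu(A1, A2, b)
```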
Journal
Journal title: Mathematics of Computation
Year: 2016
ISSN: 0025-5718, 1088-6842
DOI: 10.1090/mcom/3145